Smoothed Performance Guarantees for Local Search

Authors

  • Tobias Brunsch
  • Heiko Röglin
  • Cyriel Rutten
  • Tjark Vredeveld
Abstract

We study popular local search and greedy algorithms for standard machine scheduling problems. The performance guarantees of these algorithms are well understood, but the worst-case lower bounds seem somewhat contrived, and it is questionable whether they arise in practical applications. To find out how robust these bounds are, we study the algorithms in the framework of smoothed analysis, in which instances are subject to some degree of random noise. While the lower bounds for all scheduling variants with restricted machines are rather robust, the bounds for unrestricted machines turn out to be fragile. In particular, we show that the smoothed performance guarantees of the jump and the lex-jump algorithms are (in contrast to the worst case) independent of the number of machines: they are Θ(φ) and Θ(log φ), respectively, where 1/φ is a parameter measuring the magnitude of the perturbation. The latter immediately implies that the smoothed price of anarchy for routing games on parallel links is also Θ(log φ). Additionally, we show that for unrestricted machines the greedy list scheduling algorithm also has an approximation guarantee of Θ(log φ).
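As a rough illustration of the algorithms named above (this sketch is not taken from the paper), greedy list scheduling and a jump-style local search for makespan minimization on identical parallel machines can be written as follows. The function names, the sample instance, and the exact improvement rule used here are assumptions made purely for illustration.

    # Minimal sketch (not from the paper): greedy list scheduling and a jump-style
    # local search for makespan minimization on identical parallel machines.
    # Function names and the sample instance below are hypothetical.

    def list_schedule(jobs, m):
        """Assign each job, in list order, to a machine with currently smallest load."""
        loads = [0.0] * m
        assignment = [[] for _ in range(m)]
        for p in jobs:
            i = min(range(m), key=lambda k: loads[k])
            assignment[i].append(p)
            loads[i] += p
        return assignment, loads

    def jump_local_search(assignment, loads):
        """Move a single job to another machine as long as it would complete
        earlier there (target load + job < source load); stop at a local optimum."""
        m = len(loads)
        improved = True
        while improved:
            improved = False
            for i in range(m):
                for p in list(assignment[i]):
                    j = min(range(m), key=lambda k: loads[k])  # best jump target
                    if j != i and loads[j] + p < loads[i]:
                        assignment[i].remove(p)
                        assignment[j].append(p)
                        loads[i] -= p
                        loads[j] += p
                        improved = True
        return assignment, loads

    if __name__ == "__main__":
        jobs = [3.0, 2.5, 2.0, 1.5, 1.0, 1.0]   # hypothetical processing times
        assignment, loads = list_schedule(jobs, m=2)
        assignment, loads = jump_local_search(assignment, loads)
        print("makespan:", max(loads))

Roughly speaking, the jump rule above accepts a move only if the moved job strictly improves its completion time; the lex-jump variant studied in the paper instead compares schedules by the lexicographic order of their sorted load vectors.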


Related articles

Smoothed Analysis of Local Search Algorithms

Smoothed analysis is a method for analyzing the performance of algorithms for which classical worst-case analysis fails to explain the performance observed in practice. Smoothed analysis has been applied in recent years to explain the performance of a variety of algorithms. One particular class of algorithms where smoothed analysis has been used successfully is local search algorithms. We gi...


Parameterized Complexity of DPLL Search Procedure

We study the performance of DPLL algorithms on parameterized problems. In particular, we investigate how difficult it is to decide whether small solutions exist for satisfiability and other combinatorial problems. For this purpose we develop a Prover-Delayer game which models the running time of DPLL procedures and we establish an information-theoretic method to obtain lower bo...


Multi-objective Differential Evolution for the Flow shop Scheduling Problem with a Modified Learning Effect

This paper proposes an effective multi-objective differential evolution algorithm (MDES) to solve a permutation flow shop scheduling problem (PFSSP) with modified Dejong's learning effect. The proposed algorithm combines the basic differential evolution (DE) with local search and borrows the selection operator from NSGA-II to improve the general performance.  First the problem is encoded with a...


On the Smoothed Approximation Ratio of the 2-Opt Heuristic for the TSP

The 2-Opt heuristic is a simple, easy-to-implement local search heuristic for the traveling salesman problem. While it usually provides good approximations to the optimal tour in experiments, its worst-case performance is poor. In an attempt to explain the approximation performance of 2-Opt, we prove an upper bound of exp(O(√(log(1/σ)))) for the smoothed approximation ratio of 2-Opt. As a lower...
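For readers unfamiliar with the heuristic itself, the sketch below (not taken from the cited paper; the instance and names are hypothetical) shows the basic 2-Opt improvement step: remove two edges of the tour and reconnect the two resulting paths by reversing one segment, accepting the move whenever it shortens the tour.

    # Minimal sketch (not from the cited paper): 2-Opt local search for the
    # Euclidean TSP on random points; names and instance are hypothetical.
    import math
    import random

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(tour, pts):
        """Reverse the segment between two edges whenever that shortens the tour."""
        n = len(tour)
        improved = True
        while improved:
            improved = False
            for i in range(n - 1):
                for j in range(i + 2, n):
                    if i == 0 and j == n - 1:
                        continue  # these two tour edges share a node; skip
                    a, b = tour[i], tour[i + 1]
                    c, d = tour[j], tour[(j + 1) % n]
                    # replace edges (a, b) and (c, d) by (a, c) and (b, d)
                    if (math.dist(pts[a], pts[c]) + math.dist(pts[b], pts[d])
                            < math.dist(pts[a], pts[b]) + math.dist(pts[c], pts[d])):
                        tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                        improved = True
        return tour

    if __name__ == "__main__":
        random.seed(0)
        pts = [(random.random(), random.random()) for _ in range(30)]
        tour = two_opt(list(range(len(pts))), pts)
        print("tour length:", round(tour_length(tour, pts), 3))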


No bad local minima: Data independent training error guarantees for multilayer neural networks

We use smoothed analysis techniques to provide guarantees on the training loss of Multilayer Neural Networks (MNNs) at differentiable local minima. Specifically, we examine MNNs with piecewise linear activation functions, quadratic loss and a single output, under mild over-parametrization. We prove that for a MNN with one hidden layer, the training error is zero at every differentiable local mi...



Publication date: 2011